Entropy and Relative Entropy From Information-Theoretic Principles

Authors

Abstract

We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal information-theoretic axioms, namely monotonicity under mixing and data-processing as well as additivity for product distributions. We find that these axioms induce sufficient structure to establish continuity in the interior of the probability simplex and meaningful upper and lower bounds; e.g., we find that every relative entropy satisfying these axioms must lie between the Rényi divergences of order 0 and ∞. We further show simple conditions for positive definiteness of such relative entropies and a characterisation in terms of a variant of relative trumping. Our main result is a one-to-one correspondence between entropies and relative entropies.
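The sandwich bound mentioned in the abstract can be checked numerically for the best-known member of the family, the Kullback-Leibler divergence: the Rényi divergence of order 0 is D_0(p‖q) = -log Σ_{i: p_i>0} q_i and the order-∞ divergence is D_∞(p‖q) = log max_i p_i/q_i. The sketch below (an illustration of the standard definitions, not code from the paper) verifies D_0 ≤ D_KL ≤ D_∞ on a small example.

```python
import math

def renyi_divergences(p, q):
    """Return (D_0, D_KL, D_inf) for discrete distributions p, q.

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity), so all
    logarithms and ratios below are finite.
    """
    support = [(pi, qi) for pi, qi in zip(p, q) if pi > 0]
    # Order 0: -log of the q-mass of the support of p.
    d0 = -math.log(sum(qi for _, qi in support))
    # Order 1: the Kullback-Leibler divergence.
    dkl = sum(pi * math.log(pi / qi) for pi, qi in support)
    # Order infinity: log of the worst-case likelihood ratio.
    dinf = math.log(max(pi / qi for pi, qi in support))
    return d0, dkl, dinf

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
d0, dkl, dinf = renyi_divergences(p, q)
# The sandwich bound from the abstract: D_0 <= D_KL <= D_inf.
assert d0 <= dkl <= dinf
```

Here p is supported everywhere, so D_0 = -log 1 = 0, while D_∞ = log(0.5/0.2) = log 2.5, and the KL divergence falls strictly between them.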


Similar Articles

Quantum Dynamical Entropy from Relative entropy

In this exposition we prove an existence theorem for a quantum mechanical dynamical entropy based on von Neumann's measurement theory. To that end, we introduce a Shannon-type information associated with a quantum channel or measurement, based on Araki's relative entropy. This is an invariant for the dynamics which generalizes the Kolmogorov-Sinai notion of dynamical entropy of a measure pres...


Information, relative entropy of entanglement, and irreversibility.

Previously proposed measures of entanglement, such as entanglement of formation and assistance, are shown to be special cases of the relative entropy of entanglement. The difference between these measures for an ensemble of mixed states is shown to depend on the availability of classical information about particular members of the ensemble. Based on this, relations between relative entropy of e...


Information-Theoretic Learning Using Renyi’s Quadratic Entropy

Learning from examples has been traditionally based on correlation or on the mean square error (MSE) criterion, in spite of the fact that learning is intrinsically related with the extraction of information from examples. The problem is that Shannon’s Information Entropy, which has a sound theoretical foundation, is not easy to implement in a learning from examples scenario. In this paper, Reny...
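The quantity this abstract refers to is Rényi's quadratic entropy, H_2(p) = -log Σ_i p_i², whose argument Σ_i p_i² is the "information potential" that makes it attractive for learning (in practice it is estimated from samples with Parzen windows; the minimal sketch below just evaluates the definition on a known discrete distribution, which is an illustration and not the paper's estimator).

```python
import math

def renyi_quadratic_entropy(p):
    """H_2(p) = -log sum_i p_i^2, the log of the information potential."""
    return -math.log(sum(pi * pi for pi in p))

def shannon_entropy(p):
    """H_1(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.7, 0.2, 0.1]
h2 = renyi_quadratic_entropy(p)
h1 = shannon_entropy(p)
# Rényi entropies are non-increasing in the order, so H_2 <= H_1,
# with equality exactly for uniform distributions.
assert h2 <= h1
```

For a uniform distribution both entropies coincide at log n; the gap between them grows as the distribution concentrates, which is one reason H_2 is a usable surrogate for Shannon entropy in learning criteria.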


Information Theoretic Interpretations for H∞ Entropy

Based on studies of information transmission in discrete multivariable linear time-invariant (LTI) systems disturbed by stationary noise, relations among entropy rate, mutual information rate, and H∞ entropy are discussed in both the general control problem and the classic tracking problem. For general control systems, equivalent relations between entropy rate and H∞ entropy are formulated by u...


Entropy, Negentropy, and Information

The concept of information, during its development, is connected to the concept of entropy created by the 19th-century thermodynamics scholars. Information means, in this viewpoint, order or negentropy. On the other hand, entropy is connected to concepts such as chaos and noise, which cause, in turn, disorder. In the present paper, ...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2021

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2021.3078337